Gelfand-Yaglom-Perez Theorem for Generalized Relative Entropies

Authors

  • Ambedkar Dukkipati
  • Shalabh Bhatnagar
Abstract

The measure-theoretic definition of Kullback-Leibler relative-entropy (KL-entropy) plays a basic role in the definitions of classical information measures. Entropy, mutual information and conditional forms of entropy can be expressed in terms of KL-entropy, and hence properties of their measure-theoretic analogs follow from those of measure-theoretic KL-entropy. These measure-theoretic definitions are key to extending the ergodic theorems of information theory to non-discrete cases. A fundamental theorem in this respect is the Gelfand-Yaglom-Perez (GYP) theorem (Pinsker, 1960, Theorem 2.4.2), which states that measure-theoretic relative-entropy equals the supremum of relative-entropies over all measurable partitions. This paper states and proves the GYP theorem for Rényi relative-entropy of order greater than one. Consequently, the result can be easily extended to Tsallis relative-entropy.
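The partition-supremum characterization in the GYP theorem can be illustrated numerically. The sketch below (illustrative distributions and names, not from the paper) computes the Rényi relative entropy of order α = 2 between two unit-variance Gaussians over a coarse and a fine finite partition of the real line; each partition value stays below the measure-theoretic value α(μP − μQ)²/2 and approaches it as the partition is refined.

```python
import math

# Numerical sketch of the GYP-style supremum for Renyi relative entropy
# of order alpha > 1. For P = N(0, 1) and Q = N(0.5, 1) the
# measure-theoretic Renyi divergence of order alpha is
# alpha * (mu_P - mu_Q)**2 / 2, i.e. 0.25 for alpha = 2; divergences
# over finite measurable partitions stay below it and approach it
# under refinement.

def normal_cdf(x, mu):
    """CDF of N(mu, 1); math.erf handles +/-inf correctly."""
    return 0.5 * (1.0 + math.erf((x - mu) / math.sqrt(2.0)))

def bin_masses(mu, edges):
    """Probability mass of each cell of the partition, tails included."""
    cuts = [-math.inf] + list(edges) + [math.inf]
    return [normal_cdf(b, mu) - normal_cdf(a, mu)
            for a, b in zip(cuts, cuts[1:])]

def partition_renyi(alpha, edges, mu_p=0.0, mu_q=0.5):
    """Renyi relative entropy of order alpha over a finite partition."""
    ps = bin_masses(mu_p, edges)
    qs = bin_masses(mu_q, edges)
    # Guard against floating-point underflow in the extreme tails.
    s = sum(p ** alpha * q ** (1.0 - alpha)
            for p, q in zip(ps, qs) if p > 0 and q > 0)
    return math.log(s) / (alpha - 1.0)

coarse = partition_renyi(2.0, [-1.0, 0.0, 1.0])
fine = partition_renyi(2.0, [i * 0.05 for i in range(-120, 121)])
analytic = 2.0 * 0.5 ** 2 / 2.0  # measure-theoretic value, 0.25
```

Since the fine grid contains the coarse cut points, it is a refinement of the coarse partition, so for α > 1 the partition value can only increase, mirroring the supremum in the theorem.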


Similar articles

Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals

The measure-theoretic definition of Kullback-Leibler relative-entropy (or simply KL-entropy) plays a basic role in defining various classical information measures on general spaces. Entropy, mutual information and conditional forms of entropy can be expressed in terms of KL-entropy and hence properties of their measure-theoretic analogs will follow from those of measure-theoretic KL-entropy. The...


On Generalized Measures of Information with Maximum and Minimum Entropy Prescriptions

Kullback-Leibler relative-entropy or KL-entropy of P with respect to R, defined as ∫_X ln(dP/dR) dP, where P and R are probability measures on a measurable space (X, M), plays a basic role in the definitions of classical information measures. It overcomes a shortcoming of Shannon entropy, whose discrete-case definition cannot be extended to the non-discrete case naturally. Further, entropy and oth...
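The integral ∫_X ln(dP/dR) dP can be evaluated directly when P and R have densities. The sketch below (the densities, function names, and quadrature grid are illustrative choices, not from the paper) approximates the KL-entropy of Exp(2) with respect to Exp(1) by trapezoidal quadrature and compares it with the closed form ln(λ₁/λ₂) + λ₂/λ₁ − 1.

```python
import math

# Measure-theoretic KL-entropy KL(P||R) = int ln(dP/dR) dP, evaluated
# numerically for p(x) = 2*exp(-2x) and r(x) = exp(-x) on [0, inf).

def p(x):
    return 2.0 * math.exp(-2.0 * x)

def r(x):
    return math.exp(-x)

def kl_quadrature(n=20000, upper=20.0):
    """Trapezoidal approximation of int_0^upper p(x)*ln(p(x)/r(x)) dx."""
    h = upper / n
    xs = [i * h for i in range(n + 1)]
    ys = [p(x) * math.log(p(x) / r(x)) for x in xs]
    return h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))

# Closed form for exponentials: ln(l1/l2) + l2/l1 - 1 with l1=2, l2=1.
closed_form = math.log(2.0) + 1.0 / 2.0 - 1.0
```

The truncation of the domain at `upper = 20` is harmless here because the integrand decays like e^(−2x); the quadrature agrees with the closed form to well under 10⁻⁴.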


Connecting the Chiral and Heavy Quark Limits : Full Mass Dependence of Fermion Determinant in an Instanton Background

This talk reports work done in collaboration with Jin Hur, Choonkyu Lee and Hyunsoo Min concerning the computation of the precise mass dependence of the fermion determinant for quarks in the presence of an instanton background. The result interpolates smoothly between the previously known chiral and heavy quark limits of extreme small and large mass. The computational method makes use of the fa...


Displacement convexity of generalized relative entropies. II

We introduce a class of generalized relative entropies (inspired by the Bregman divergence in information theory) on the Wasserstein space over a weighted Riemannian or Finsler manifold. We prove that the convexity of all the entropies in this class is equivalent to the combination of the non-negative weighted Ricci curvature and the convexity of another weight function used in the definition o...


On Generalized Csiszár-Kullback Inequalities

The classical Csiszár-Kullback inequality bounds the L¹-distance of two probability densities in terms of their relative (convex) entropies. Here we generalize such inequalities to not necessarily normalized and possibly non-positive L¹ functions. Also, our generalized Csiszár-Kullback inequalities are in many important cases sharper than the classical ones (in terms of the functional de...
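The classical inequality in this snippet can be checked numerically in its Pinsker form, which bounds the L¹ distance of two probability vectors by the square root of twice their KL relative entropy; the two distributions below are arbitrary illustrative choices.

```python
import math

# Classical Csiszar-Kullback (Pinsker) inequality for probability
# vectors: ||p - q||_1 <= sqrt(2 * KL(p || q)).

def l1_distance(p, q):
    return sum(abs(a - b) for a, b in zip(p, q))

def kl(p, q):
    """Discrete KL relative entropy, skipping zero-mass terms of p."""
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

bound = math.sqrt(2.0 * kl(p, q))  # upper bound on the L1 distance
gap = bound - l1_distance(p, q)    # non-negative when the bound holds
```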



Journal:

Volume   Issue 

Pages  -

Publication date: 2006